Multi-kernel regularized classifiers
Authors
Abstract
A family of classification algorithms generated from Tikhonov regularization schemes is considered. These algorithms involve multi-kernel spaces and general convex loss functions. Our main purpose is to provide satisfactory estimates for the excess misclassification error of these multi-kernel regularized classifiers. The error analysis consists of two parts: regularization error and sample error. Allowing multi-kernels in the algorithm improves both the regularization error and the approximation error, which is one advantage of the multi-kernel setting. For a general loss function, we show how to bound the regularization error in terms of approximation in some weighted L^q spaces. For the sample error, we use a projection operator. The projection, in connection with the decay of the regularization error, enables us to improve convergence rates known in the literature, even for one-kernel schemes and special loss functions: the least square loss and the hinge loss used by support vector machine soft margin classifiers. Existence of a solution to the optimization problem for the regularization scheme associated with multi-kernels is verified when the kernel functions are continuous with respect to the index set. Gaussian kernels with flexible variances and probability distributions satisfying certain noise conditions are used to illustrate the general theory.
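For orientation, the following is a minimal sketch of the kind of multi-kernel Tikhonov regularization scheme described above; the notation (sample z, convex loss φ, kernels K_σ, index set Σ, regularization parameter λ) is assumed here for illustration and is not quoted from the paper's body.

% Sketch of a multi-kernel Tikhonov regularization scheme (assumed notation):
% given a sample z = {(x_i, y_i)}_{i=1}^m, a convex loss \phi, a parameter \lambda > 0,
% and a family of reproducing kernels {K_\sigma : \sigma \in \Sigma}, the classifier is
% the sign of a minimizer taken over the union of the kernel spaces.
\[
  f_{\mathbf{z},\lambda}
  \;=\; \operatorname*{arg\,min}_{\sigma \in \Sigma}\;
        \min_{f \in \mathcal{H}_{K_\sigma}}
        \Bigl\{ \tfrac{1}{m}\textstyle\sum_{i=1}^{m} \phi\bigl(y_i f(x_i)\bigr)
                \;+\; \lambda \,\|f\|_{K_\sigma}^{2} \Bigr\}.
\]
% With the hinge loss \phi(t) = \max\{1 - t, 0\} this reduces to the SVM soft margin
% scheme; Gaussian kernels with flexible variances correspond to taking
% K_\sigma(x, u) = \exp(-|x - u|^2 / \sigma^2) with \sigma ranging over the index set \Sigma.

In this reading, the one-kernel case corresponds to a singleton index set Σ, and enlarging Σ can only decrease the regularized empirical objective, which is the intuition behind the improved regularization error mentioned in the abstract.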
Similar resources
Regularized Least Squares Piecewise Multi-classification Machine
This paper presents a Tikhonov regularization based piecewise classification model for multi-category discrimination of sets or objects. The proposed model includes both a linear classification and a nonlinear kernel classification formulation. Advantages of the regularized multi-classification formulations include their ability to express a multi-class problem as a single and unconstrained optimi...
Indoor Localization via Discriminatively Regularized Least Square Classification
In this paper, we address the received signal strength (RSS)-based indoor localization problem in a wireless local area network (WLAN) environment and formulate it as a multi-class classification problem using survey locations as classes. We present a discriminatively regularized least square classifier (DRLSC)-based localization algorithm that is aimed at making use of the class label informat...
Online Learning with Regularized Kernel for One-class Classification
This paper presents an online learning with regularized kernel based one-class extreme learning machine (ELM) classifier, referred to as "online RK-OC-ELM". The baseline kernel hyperplane model considers the whole data in a single chunk with the regularized ELM approach for offline learning in the case of one-class classification (OCC). Further, the basic hyperplane model is adapted in an online fashio...
The Margin Vector, Admissible Loss and Multi-class Margin-based Classifiers
We propose a new framework to construct the margin-based classifiers, in which the binary and multicategory classification problems are solved by the same principle; namely, margin-based classification via regularized empirical risk minimization. To build the framework, we propose the margin vector which is the multi-class generalization of the margin, then we further generalize the concept of ...
Classification with Kernel Mahalanobis Distance Classifiers
Within the framework of kernel methods, linear data methods have almost completely been extended to their nonlinear counterparts. In this paper, we focus on nonlinear kernel techniques based on the Mahalanobis distance. Two approaches are distinguished here. The first one assumes an invertible covariance operator, while the second one uses a regularized covariance. We discuss conceptual and exp...
Journal: J. Complexity
Volume 23, Issue -
Pages: -
Publication year: 2007